
    The consistency of empirical comparisons of regression and analogy-based software project cost prediction

    OBJECTIVE – To determine the consistency within and between results in empirical studies of software engineering cost estimation. We focus on regression and analogy techniques as these are commonly used. METHOD – We conducted an exhaustive search using predefined inclusion and exclusion criteria and identified 67 journal papers and 104 conference papers. From this sample we identified 11 journal papers and 9 conference papers that used both methods. RESULTS – Our analysis found that about 25% of studies were internally inconclusive. We also found approximately equal evidence in favour of, and against, analogy-based methods. CONCLUSIONS – We confirm the lack of consistency in the findings and argue that this inconsistent pattern across 20 different studies comparing regression and analogy is somewhat disturbing. It suggests that we need to ask more detailed questions than just: “What is the best prediction system?”
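
    As a purely illustrative sketch of the two families of prediction systems compared in the review, the snippet below contrasts a regression model with an analogy-based (k-nearest-neighbour) estimator on a toy dataset; the project attributes, effort values and choice of k are invented assumptions, not data from any of the reviewed studies.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor

# Hypothetical historical projects: [size in function points, team experience in years].
X = np.array([[120, 3], [300, 5], [80, 2], [450, 7], [200, 4]])
effort = np.array([14.0, 40.0, 9.0, 65.0, 26.0])  # person-months (invented values)

new_project = np.array([[250, 4]])

# Regression-based estimation: fit a parametric model of effort on project attributes.
reg = LinearRegression().fit(X, effort)
print("regression estimate:", reg.predict(new_project))

# Analogy-based estimation: average the effort of the k most similar past projects.
knn = KNeighborsRegressor(n_neighbors=2).fit(X, effort)
print("analogy (k-NN) estimate:", knn.predict(new_project))
```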

    Development and formative evaluation of the e-Health implementation toolkit

    Background: The use of Information and Communication Technology (ICT) or e-Health is seen as essential for a modern, cost-effective health service. However, there are well-documented problems with the implementation of e-Health initiatives, despite the existence of a great deal of research into how best to implement e-Health (an example of the gap between research and practice). This paper reports on the development and formative evaluation of an e-Health Implementation Toolkit (e-HIT), which aims to summarise and synthesise new and existing research on the implementation of e-Health initiatives and present it to senior managers in a user-friendly format. Results: The content of the e-HIT was derived by combining data from a systematic review of reviews of barriers and facilitators to the implementation of e-Health initiatives with qualitative data from interviews with "implementers", that is, people who had been charged with implementing an e-Health initiative. These data were summarised, synthesised and combined with the constructs from the Normalisation Process Model. The software for the toolkit was developed by a commercial company (RocketScience). Formative evaluation was undertaken by obtaining user feedback. There are three components to the toolkit: a section on background and instructions for use aimed at novice users; the toolkit itself; and the report generated by completing the toolkit. It is available to download from http://www.ucl.ac.uk/pcph/research/ehealth/documents/e-HIT.xls. Conclusions: The e-HIT shows potential as a tool for enhancing future e-Health implementations. Further work is needed to make it fully web-enabled and to determine its predictive potential for future implementations.
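
    As an illustration of the general shape of such a toolkit (not the actual e-HIT content, constructs, or scoring rules), the sketch below groups hypothetical items under placeholder construct labels, accepts user ratings, and generates a short report.

```python
from statistics import mean

# Placeholder construct labels (borrowed from Normalization Process Theory) and
# invented items; the real e-HIT sections and questions differ.
items = {
    "Coherence": ["Staff understand what the e-Health service is for"],
    "Cognitive participation": ["Key clinical champions are engaged"],
    "Collective action": ["The service fits existing workflows"],
    "Reflexive monitoring": ["There is a plan to review how the service is working"],
}

def generate_report(scores):
    """scores maps each construct to a list of 1-5 user ratings; returns a text summary."""
    lines = []
    for construct, ratings in scores.items():
        avg = mean(ratings)
        flag = "likely barrier" if avg < 3 else "likely facilitator"
        lines.append(f"{construct}: mean rating {avg:.1f} ({flag})")
    return "\n".join(lines)

# Example: a user rates every item 4 out of 5.
print(generate_report({construct: [4] for construct in items}))
```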

    Fragmentation patterns and personalized sequencing of cell-free DNA in urine and plasma of glioma patients.

    Glioma-derived cell-free DNA (cfDNA) is challenging to detect using liquid biopsy because quantities in body fluids are low. We determined the glioma-derived DNA fraction in cerebrospinal fluid (CSF), plasma, and urine samples from patients using sequencing of personalized capture panels guided by analysis of matched tumor biopsies. By sequencing cfDNA across thousands of mutations, identified individually in each patient's tumor, we detected tumor-derived DNA in the majority of CSF (7/8), plasma (10/12), and urine samples (10/16), with median tumor fractions of 6.4 × 10⁻³, 3.1 × 10⁻⁵, and 4.7 × 10⁻⁵, respectively. We identified a shift in the size distribution of tumor-derived cfDNA fragments in these body fluids. We further analyzed cfDNA fragment sizes using whole-genome sequencing in urine samples from 35 glioma patients, 27 individuals with non-malignant brain disorders, and 26 healthy individuals. cfDNA in the urine of glioma patients was significantly more fragmented than urine cfDNA from patients with non-malignant brain disorders (P = 1.7 × 10⁻²) and from healthy individuals (P = 5.2 × 10⁻⁹). Machine learning models integrating fragment length could differentiate urine samples from glioma patients (AUC = 0.80–0.91), suggesting possibilities for truly non-invasive cancer detection.
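
    A minimal sketch of the kind of fragment-length-based classification described, assuming invented fragment lengths, labels, and feature choices rather than the authors' pipeline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def fragment_features(lengths):
    """Proportion of short fragments and median fragment length for one sample."""
    lengths = np.asarray(lengths)
    return [np.mean((lengths >= 20) & (lengths <= 150)), np.median(lengths)]

# Hypothetical per-sample fragment-length lists (bp); label 1 = glioma, 0 = control.
samples = [
    ([105, 132, 148, 160, 171, 98], 1),
    ([167, 172, 180, 169, 175, 166], 0),
    ([110, 125, 140, 158, 166, 150], 1),
    ([170, 168, 176, 181, 165, 172], 0),
]

X = np.array([fragment_features(lengths) for lengths, _ in samples])
y = np.array([label for _, label in samples])

# Fit a simple classifier on the fragment-length features.
model = LogisticRegression().fit(X, y)
print("predicted classes:", model.predict(X))
```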

    Proteomic identification of secreted proteins as surrogate markers for signal transduction inhibitor activity

    Epidermal growth factor receptor is a potential target for cancer treatment, and new small-molecule tyrosine kinase inhibitor drugs have been designed to inhibit its activity. In this work we identify potential surrogate markers of drug activity using a proteomic analysis. Two-dimensional electrophoresis was optimised to compare expression patterns of proteins secreted from the cancer cell lines A431 and A549 treated with Gefitinib (Iressa) vs untreated or vehicle-only-treated samples. Upregulated or downregulated proteins were detected using Phoretix 2D image analysis software. Several proteins were then identified using matrix-assisted laser desorption/ionization-time of flight (MALDI-TOF) mass spectrometry. In one case, upregulation of Protein Disulphide Isomerase in response to Gefitinib was confirmed by Western blot analysis, and the response was shown to be concentration dependent. The identification of surrogate markers may be of use for the evaluation of new drugs in preclinical models, in clinical trials, and in the therapy of individual patients, to give optimal biological drug doses.
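
    A minimal sketch of fold-change-based detection of up- and down-regulated secreted proteins (a stand-in for the Phoretix 2D image analysis step; spot identifiers, intensities, and the threshold are invented):

```python
# Each spot maps to (control intensity, Gefitinib-treated intensity); values are invented.
spot_intensity = {
    "PDI": (1200.0, 3100.0),
    "spot_17": (850.0, 400.0),
    "spot_42": (560.0, 610.0),
}

FOLD_CUTOFF = 2.0  # assumed threshold for calling a change

for spot, (control, treated) in spot_intensity.items():
    fold = treated / control
    if fold >= FOLD_CUTOFF:
        print(f"{spot}: upregulated ({fold:.1f}-fold)")
    elif fold <= 1 / FOLD_CUTOFF:
        print(f"{spot}: downregulated ({fold:.1f}-fold)")
```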

    Relationship between B-type natriuretic peptide levels and echocardiographic indices of left ventricular filling pressures in post-cardiac surgery patients

    Background: B-type natriuretic peptide (BNP) is increased in post-cardiac surgery patients; however, the mechanisms underlying BNP release are still unclear. In the current study, we aimed to assess the relationship between postoperative BNP levels and left ventricular filling pressures in post-cardiac surgery patients. Methods: We prospectively enrolled 134 consecutive patients referred to our Center 8 ± 5 days after cardiac surgery. BNP was sampled at hospital admission and related to the following echocardiographic parameters: left ventricular (LV) diastolic volume (DV), LV systolic volume (SV), LV ejection fraction (EF), LV mass, relative wall thickness (RWT), indexed left atrial volume (iLAV), mitral inflow E/A ratio, mitral E wave deceleration time (DT), and the ratio of the transmitral E wave to the Doppler tissue early mitral annulus velocity (E/E'). Results: A total of 124 patients had both BNP and echocardiographic data. BNP values were significantly elevated (mean 353 ± 356 pg/ml), with normal values in only 17 patients (13.7%). Mean LVEF was 59 ± 10% (LVEF ≥50% in 108 patients). There was no relationship between BNP and LVEF (p = 0.11), LVDV (p = 0.88), LVSV (p = 0.50), E/A (p = 0.77), DT (p = 0.33) or RWT (p = 0.50). In contrast, BNP was directly related to E/E' (p < 0.001), LV mass (p = 0.006) and iLAV (p = 0.026). At multivariable regression analysis, age and E/E' were the only independent predictors of BNP levels. Conclusion: In post-cardiac surgery patients with overall preserved LV systolic function, the significant increase in BNP levels is related to E/E', an echocardiographic parameter of elevated LV filling pressures, which indicates left atrial pressure as a major determinant of BNP release in this clinical setting.
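
    A minimal sketch of the multivariable regression analysis described, assuming a hypothetical dataset and column names rather than the authors' data:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical dataset: one row per patient with BNP and echocardiographic measures.
df = pd.read_csv("postop_echo_bnp.csv")
predictors = ["age", "e_over_e_prime", "lv_mass", "ilav", "lvef"]

# Multivariable linear regression of BNP on the candidate predictors.
X = sm.add_constant(df[predictors])
model = sm.OLS(df["bnp"], X).fit()

# The summary lists a coefficient and p-value per predictor, which is how
# independent predictors (here reported as age and E/E') would be identified.
print(model.summary())
```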

    Translational framework for implementation evaluation and research: Protocol for a qualitative systematic review of studies informed by Normalization Process Theory (NPT) [version 1; peer review: 2 approved].

    Background: Normalization Process Theory (NPT) identifies mechanisms that have been demonstrated to play an important role in implementation processes. It is now widely used to inform feasibility, process evaluation, and implementation studies in healthcare and other areas of work. This qualitative synthesis of NPT studies aims to better understand how NPT explains observed and reported implementation processes, and to explore the ways in which its constructs explain the implementability, enactment, and sustainment of complex healthcare interventions. Methods: We will systematically search the Scopus, PubMed and Web of Science databases and use the Google Scholar search engine for citations of key papers in which NPT was developed. This will identify English-language peer-reviewed articles in scientific journals reporting (a) primary qualitative or mixed-methods studies, or (b) qualitative or mixed-methods evidence syntheses in which NPT was the primary analytic framework. Studies may be conducted in any healthcare setting and must have been published between June 2006 and 31 December 2021. We will perform a qualitative synthesis of included studies using two parallel methods: (i) directed content analysis based on an already developed coding manual; and (ii) unsupervised textual analysis using Leximancer® topic modelling software. Other: We will disseminate the results of the review through peer-reviewed publications, conference and seminar presentations, and social media (Facebook and Twitter) channels. The primary source of funding is the National Institute for Health Research ARC North Thames. No human subjects or personal data are involved and no ethical issues are anticipated.
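
    The protocol's unsupervised textual analysis uses the proprietary Leximancer software; as an open-source stand-in illustrating the same general idea, a topic-modelling sketch might look like the following (file names and the number of topics are assumptions):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical plain-text extracts of included studies.
documents = [open(path, encoding="utf-8").read() for path in ["study1.txt", "study2.txt"]]

# Build a document-term matrix and fit a topic model.
vectorizer = CountVectorizer(stop_words="english", max_features=5000)
dtm = vectorizer.fit_transform(documents)
lda = LatentDirichletAllocation(n_components=5, random_state=0).fit(dtm)

# Print the top terms per topic for qualitative inspection alongside the coding manual.
terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top_terms = [terms[j] for j in topic.argsort()[-8:][::-1]]
    print(f"topic {i}: {', '.join(top_terms)}")
```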

    Enhanced detection of circulating tumor DNA by fragment size analysis.

    Existing methods to improve detection of circulating tumor DNA (ctDNA) have focused on genomic alterations but have rarely considered the biological properties of plasma cell-free DNA (cfDNA). We hypothesized that differences in fragment lengths of circulating DNA could be exploited to enhance sensitivity for detecting the presence of ctDNA and for noninvasive genomic analysis of cancer. We surveyed ctDNA fragment sizes in 344 plasma samples from 200 patients with cancer using low-pass whole-genome sequencing (0.4×). To establish the size distribution of mutant ctDNA, tumor-guided personalized deep sequencing was performed in 19 patients. We detected enrichment of ctDNA in fragment sizes between 90 and 150 bp and developed methods for in vitro and in silico size selection of these fragments. Selecting fragments between 90 and 150 bp improved detection of tumor DNA, with more than twofold median enrichment in >95% of cases and more than fourfold enrichment in >10% of cases. Analysis of size-selected cfDNA identified clinically actionable mutations and copy number alterations that were otherwise not detected. Identification of plasma samples from patients with advanced cancer was improved by predictive models integrating fragment length and copy number analysis of cfDNA, with area under the curve (AUC) values of >0.99 and 0.91 with fragmentation features, compared to AUC <0.5 without them. Fragment size analysis and selective sequencing of specific fragment sizes can boost ctDNA detection and could complement or provide an alternative to deeper sequencing of cfDNA. We would like to acknowledge the support of The University of Cambridge, Cancer Research UK and the EPSRC (CRUK grant numbers A11906 (NR), A20240 (NR), A22905 (JDB), A15601 (JDB), A25177 (CRUK Cancer Centre Cambridge), A17242 (KMB), A16465 (CRUK-EPSRC Imaging Centre in Cambridge and Manchester)). The research leading to these results has received funding from the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement n. 337905. The research was supported by the National Institute for Health Research Cambridge, National Cancer Research Network, Cambridge Experimental Cancer Medicine Centre and Hutchison Whampoa Limited. This research is also supported by Target Ovarian Cancer and the Medical Research Council through their Joint Clinical Research Training Fellowship for Dr Moore. The CALIBRATE study was supported by funding from AstraZeneca.
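
    A minimal sketch of in silico size selection as described in the abstract (not the authors' released code), keeping read pairs whose inferred fragment length falls in the 90-150 bp window; file paths are placeholders:

```python
import pysam

MIN_LEN, MAX_LEN = 90, 150  # fragment-size window highlighted in the abstract

# Read an aligned plasma cfDNA BAM and write a size-selected BAM with the same header.
with pysam.AlignmentFile("plasma_cfDNA.bam", "rb") as bam_in, \
     pysam.AlignmentFile("plasma_cfDNA.size_selected.bam", "wb", template=bam_in) as bam_out:
    for read in bam_in:
        frag_len = abs(read.template_length)  # fragment length inferred from the read pair
        if read.is_proper_pair and MIN_LEN <= frag_len <= MAX_LEN:
            bam_out.write(read)
```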

    Comprehensive or Comprehensible Experience? A Case Study of Religion and Traumatic Bereavement

    The first half of this article provides a brief overview of two respective projects concerning traumatic bereavement, in which religious faith appeared to feature amid a constellation of significant coping and sense-making mechanisms for survivors. After presenting some illustrative examples of the kind of data produced in the course of our research, the second half of the article develops a retrospectively critical appraisal of our data collection and corresponding analysis practices. In questioning the extent to which our accounts of our participants’ accounts can be considered adequate representations of social order, we critically explore the relative potential of ‘reflexivity’ for bridging the experiential gap between researchers and participants. Taken together, these reflections prompt a return to the salutary question: what counts as sociologically ‘see-able’?

    Characteristics of successfully implemented telemedical applications

    Background: There has been an increased interest in the use of telemedical applications in clinical practice in recent years. Considerable effort has been invested in trials and experimental services. Yet, surprisingly few applications have continued beyond the research and development phase. The aim of this study is to explore characteristics of successfully implemented telemedical applications. Methods: An extensive search of the telemedicine literature was conducted in order to identify relevant articles. Following a defined selection process, a small number of articles were identified that described characteristics of successfully implemented telemedical applications. These articles were analysed qualitatively, drawing on central procedures from Grounded Theory (GT), including condensation and categorisation. The analysis resulted in a description of features found to be of importance for a successful implementation of telemedicine. Subsequently, these features were discussed in light of Science and Technology Studies (STS) and the concept of 'social negotiation'. Results: Telemedical applications introduced into routine practice are typically characterised by the following six features: 1) local service delivery problems have been clearly stated, 2) telemedicine has been seen as a benefit, 3) telemedicine has been seen as a solution to political and medical issues, 4) there was collaboration between promoters and users, 5) issues regarding organizational and technological arrangements have been addressed, and 6) the future operation of the service has been considered. Conclusion: Our findings support research arguing that technologies are not fixed entities moving from invention through diffusion and into routine use. Rather, it is the interplay between technical and social factors that produces a particular outcome. The success of a technology depends on how this interplay is managed during the process of implementation.